Adopting Two Supervisors for Efficient Use of Large-Scale Remote Deep Neural Networks - RCR Report

Authors

Abstract

This is the Replicated Computational Results (RCR) report for our TOSEM paper "Adopting Two Supervisors for Efficient Use of Large-Scale Remote Deep Neural Networks", where we propose a novel client-server architecture that leverages high-accuracy, very large neural networks running on remote servers while reducing the monetary and latency costs that typically come with using such models. As part of this RCR, we provide a replication package that allows full replication of all results and is specifically designed to facilitate reuse.
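The page does not reproduce the architecture itself, but the abstract suggests client-side logic along the following lines: a cheap local model answers when it is confident, and the expensive remote model is queried only otherwise. This is a minimal sketch, not the authors' implementation; the model callables, the entropy-based supervisors, and both thresholds are illustrative assumptions.

```python
# Illustrative sketch only: the paper's actual supervisors and API are not
# shown on this page. local_model, remote_model, and the thresholds are
# hypothetical placeholders.
import numpy as np

def entropy(probs):
    """Shannon entropy of a softmax output; a common uncertainty score."""
    p = np.clip(probs, 1e-12, 1.0)
    return float(-(p * np.log(p)).sum())

def predict(x, local_model, remote_model,
            local_threshold=0.5, remote_threshold=1.5):
    """Predict x, querying the remote model only when the first
    supervisor flags the small local model's output as uncertain."""
    local_probs = local_model(x)
    # Supervisor 1: trust the cheap local model when it is confident.
    if entropy(local_probs) <= local_threshold:
        return int(np.argmax(local_probs)), "local"
    # Otherwise pay the latency/monetary cost of the remote model.
    remote_probs = remote_model(x)
    # Supervisor 2: flag inputs on which even the remote model is unsure.
    if entropy(remote_probs) > remote_threshold:
        return None, "abstain"
    return int(np.argmax(remote_probs)), "remote"
```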


Similar Articles

Large-Scale YouTube-8M Video Understanding with Deep Neural Networks

The video classification problem has been studied for many years. The success of Convolutional Neural Networks (CNNs) in image recognition tasks gives researchers a powerful incentive to create more advanced video classification approaches. As video has temporal content, Long Short-Term Memory (LSTM) networks become a handy tool for modeling long-term temporal clues. Both approaches need a large...
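The truncated abstract points at the standard pattern of per-frame CNN features aggregated by an LSTM. A generic sketch of that pattern in PyTorch follows; this is not the authors' exact model, and all layer sizes and names are assumptions.

```python
# Generic frame-encoder + LSTM video classifier; sizes are illustrative.
import torch
import torch.nn as nn

class VideoClassifier(nn.Module):
    def __init__(self, feat_dim=512, hidden_dim=256, num_classes=1000):
        super().__init__()
        # Per-frame feature extractor; in practice a pretrained CNN.
        self.frame_encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, feat_dim),
        )
        # The LSTM aggregates per-frame features over time.
        self.lstm = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, video):           # video: (batch, time, 3, H, W)
        b, t = video.shape[:2]
        feats = self.frame_encoder(video.flatten(0, 1)).view(b, t, -1)
        _, (h, _) = self.lstm(feats)    # final hidden state summarizes clip
        return self.head(h[-1])

logits = VideoClassifier()(torch.randn(2, 16, 3, 64, 64))  # shape (2, 1000)
```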


Large Scale Distributed Deep Networks

Recent work in unsupervised feature learning and deep learning has shown that being able to train large models can dramatically improve performance. In this paper, we consider the problem of training a deep network with billions of parameters using tens of thousands of CPU cores. We have developed a software framework called DistBelief that can utilize computing clusters with thousands of machi...
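As a contrast to the scale described above, here is a toy single-process illustration of the data-parallel idea behind such frameworks: workers compute gradients on disjoint data shards and a central copy of the parameters applies the averaged update. Everything here (the quadratic loss, shard count, learning rate) is an assumption for illustration, far simpler than DistBelief itself.

```python
# Toy data-parallel SGD with a central parameter copy.
import numpy as np

rng = np.random.default_rng(0)
X, w_true = rng.normal(size=(1024, 8)), rng.normal(size=8)
y = X @ w_true

def grad(w, Xs, ys):
    """Gradient of mean squared error on one worker's data shard."""
    return 2 * Xs.T @ (Xs @ w - ys) / len(ys)

w = np.zeros(8)                               # central ("server") parameters
shards = np.array_split(np.arange(1024), 4)   # 4 workers, disjoint data
for step in range(200):
    # Each worker computes a gradient on its shard from the same w ...
    grads = [grad(w, X[idx], y[idx]) for idx in shards]
    # ... and the server applies the averaged update.
    w -= 0.1 * np.mean(grads, axis=0)

print(np.allclose(w, w_true, atol=1e-3))  # True: converges on this toy data
```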


Decentralized Adaptive Control of Large-Scale Non-affine Nonlinear Time-Delay Systems Using Neural Networks

In this paper, a decentralized adaptive neural controller is proposed for a class of large-scale nonlinear systems with unknown nonlinear, non-affine subsystems and unknown nonlinear time-delay interconnections. The stability of the closed loop system is guaranteed through Lyapunov-Krasovskii stability analysis. Simulation results are provided to show the effectiveness of the proposed approache...


Efficient Model Averaging for Deep Neural Networks

Large neural networks trained on small datasets are increasingly prone to overfitting. Traditional machine learning methods can reduce overfitting by employing bagging or boosting to train several diverse models. For large neural networks, however, this is prohibitively expensive. To address this issue, we propose a method to leverage the benefits of ensembles without explicitly training sever...
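For context, the "expensive" baseline this abstract refers to looks roughly like this: train several diverse models and average their softmax outputs at test time. A minimal sketch with stand-in logits; the three "models" here are random placeholders, not trained networks.

```python
# Classic ensemble averaging of softmax outputs (the costly baseline).
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(1)
# Stand-in outputs of 3 "models" on 4 inputs over 10 classes.
logits_per_model = [rng.normal(size=(4, 10)) for _ in range(3)]

# Averaging the probability outputs of diverse models reduces variance,
# which is what bagging/boosting buy at the cost of training several nets.
avg_probs = np.mean([softmax(l) for l in logits_per_model], axis=0)
predictions = avg_probs.argmax(axis=1)
print(predictions.shape)  # (4,)
```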


Efficient Inferencing of Compressed Deep Neural Networks

The large number of weights in deep neural networks makes the models difficult to deploy in low-memory environments such as mobile phones, IoT edge devices, and "inferencing as a service" environments on the cloud. Prior work has considered reducing the size of the models through compression techniques like pruning, quantization, Huffman encoding, etc. However, efficient inferencing usi...
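Of the compression techniques listed, quantization is the easiest to illustrate. Below is a minimal sketch of symmetric per-tensor int8 post-training weight quantization with an assumed toy weight matrix; real schemes also handle activations, zero points, and per-channel scales.

```python
# Symmetric per-tensor int8 post-training quantization sketch.
import numpy as np

def quantize_int8(w):
    """Map float weights to int8 plus a per-tensor scale factor."""
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.default_rng(2).normal(size=(256, 256)).astype(np.float32)
q, scale = quantize_int8(w)
print(q.nbytes / w.nbytes)                      # 0.25: 4x smaller in memory
print(np.abs(dequantize(q, scale) - w).max())   # small reconstruction error
```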



Journal

Journal title: ACM Transactions on Software Engineering and Methodology

سال: 2023

ISSN: 1049-331X (print); 1557-7392 (electronic)

DOI: https://doi.org/10.1145/3617594